# State Space Model

## Mamba-7B-RW

License: Apache-2.0
Organization: TRI-ML
Tags: Large Language Model, English

Mamba-7B is a 7-billion-parameter language model based on the Mamba architecture, trained for multiple epochs on the RefinedWeb dataset (1.2 trillion tokens). Mamba is a state space model that uses no self-attention mechanism, yet performs strongly across a range of natural-language benchmarks. A minimal sketch of the underlying recurrence follows.
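To make the "no self-attention" point concrete, the sketch below shows the discrete linear state space recurrence that models like Mamba build on: each token updates a fixed-size hidden state, so the cost per step is constant and the total cost is linear in sequence length. The matrices `A`, `B`, `C`, their shapes, and the 0.9 decay factor are illustrative assumptions; real Mamba additionally makes the SSM parameters input-dependent ("selective") and computes the recurrence with a hardware-aware parallel scan.

```python
import numpy as np

def ssm_scan(A, B, C, x):
    """Run a discrete linear state space model over a sequence.

    h_t = A @ h_{t-1} + B @ x_t   (state update, O(1) per step)
    y_t = C @ h_t                 (readout)

    Unlike self-attention, each step touches only the fixed-size
    state h, so total cost is linear in sequence length.
    """
    d_state = A.shape[0]
    h = np.zeros(d_state)
    ys = []
    for x_t in x:                  # x: (seq_len, d_in)
        h = A @ h + B @ x_t
        ys.append(C @ h)
    return np.stack(ys)            # (seq_len, d_out)

# Tiny illustrative run (shapes are arbitrary, not Mamba-7B's).
rng = np.random.default_rng(0)
d_state, d_in, d_out, seq_len = 16, 8, 8, 32
A = 0.9 * np.eye(d_state)                  # stable state transition
B = rng.normal(size=(d_state, d_in)) * 0.1
C = rng.normal(size=(d_out, d_state)) * 0.1
x = rng.normal(size=(seq_len, d_in))
y = ssm_scan(A, B, C, x)
print(y.shape)  # (32, 8)
```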
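If you want to try the released checkpoint, here is a hedged sketch that assumes it is published on Hugging Face under the repo id `TRI-ML/mamba-7b-rw` (inferred from the organization and model name on this page) and exposes a standard causal-LM interface. Consult the model card for the authors' actual loading instructions; Mamba checkpoints sometimes require extra packages (e.g. `mamba-ssm`) or `trust_remote_code=True`.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repo id is an assumption inferred from this page, not verified here.
repo = "TRI-ML/mamba-7b-rw"
tok = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(repo)

inputs = tok("State space models are", return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=32)
print(tok.decode(out[0], skip_special_tokens=True))
```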